Improving Boosting Methods by Generating Specific Training and Validation Sets

Authors

  • Joaquín Torres-Sospedra
  • Carlos Hernández-Espinosa
  • Mercedes Fernández-Redondo
Abstract

Previous research has shown that Bagging, Boosting, and Cross-Validation Committees can each provide good performance separately. In this paper, Boosting methods are combined with Bagging and Cross-Validation Committees in order to generate accurate ensembles and benefit from all of these alternatives. In this way, the networks are trained according to the boosting methods, but each network's specific training and validation sets are generated according to Bagging or Cross-Validation. The results show that the proposed methodologies, BagBoosting and Cross-Validated Boosting, outperform the original Boosting ensembles.
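The set-generation step described in the abstract can be sketched as follows. This is a minimal illustration of a bagging-style (bootstrap) split; the function name and interface are assumptions, not the paper's actual code.

```python
import random

def bagging_splits(n_samples, n_members, seed=0):
    """For each ensemble member, draw a bootstrap training set by
    sampling indices with replacement; the out-of-bag indices
    serve as that member's validation set."""
    rng = random.Random(seed)
    splits = []
    for _ in range(n_members):
        train = [rng.randrange(n_samples) for _ in range(n_samples)]
        val = sorted(set(range(n_samples)) - set(train))
        splits.append((train, val))
    return splits
```

Each network would then be trained with the boosting procedure on its own `train` indices, with `val` available for early stopping or model selection.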


Similar articles

Using Validation to Avoid Overfitting in Boosting

AdaBoost is a well-known, effective technique for increasing the accuracy of learning algorithms. However, it has the potential to overfit the training set because it focuses on misclassified examples, which may be noisy. We demonstrate that overfitting in AdaBoost can be alleviated in a time-efficient manner using a combination of dagging and validation sets. The training set is partitioned in...

Full text
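The dagging step mentioned in this abstract (disjoint aggregation) splits the training set into non-overlapping subsets, one per classifier. A rough sketch, with assumed names:

```python
import random

def dagging_partitions(n_samples, n_members, seed=0):
    """Dagging: shuffle the sample indices and deal them into
    disjoint, roughly equal-sized subsets, one per member."""
    rng = random.Random(seed)
    indices = list(range(n_samples))
    rng.shuffle(indices)
    return [indices[i::n_members] for i in range(n_members)]
```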

Improving Adaptive Boosting with k-Cross-Fold Validation

As seen in the bibliography, Adaptive Boosting (Adaboost) is one of the best-known methods for increasing the performance of an ensemble of neural networks. We introduce a new method based on Adaboost in which we have applied Cross-Validation to increase the diversity of the ensemble. We have used Cross-Validation over the whole learning set to generate a specific training set and validation set for ...

Full text
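The k-fold splitting this abstract describes can be sketched as follows; this is a minimal version, and the paper's actual fold assignment may differ.

```python
def kfold_splits(n_samples, k):
    """Deal indices into k disjoint folds; member i validates on
    fold i and trains on the remaining k-1 folds."""
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i in range(k):
        val = folds[i]
        held = set(val)
        train = [j for j in range(n_samples) if j not in held]
        splits.append((train, val))
    return splits
```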

Bagging, Boosting, and C4.5

Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the predictive power of classifier learning systems. Both form a set of classifiers that are combined by voting, bagging by generating replicated bootstrap samples of the data, and boosting by adjusting the weights of training instances. This paper reports results of applying both techniques to a system that le...

Full text
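The weight-adjustment step this abstract attributes to boosting can be illustrated with AdaBoost's standard update rule (a textbook sketch, not code from the paper):

```python
import math

def adaboost_reweight(weights, correct):
    """One AdaBoost round: compute the weighted error, derive the
    classifier weight alpha, then up-weight misclassified instances
    and down-weight correct ones before renormalizing."""
    err = sum(w for w, c in zip(weights, correct) if not c)
    err = min(max(err, 1e-10), 1 - 1e-10)  # guard degenerate errors
    alpha = 0.5 * math.log((1 - err) / err)
    updated = [w * math.exp(-alpha if c else alpha)
               for w, c in zip(weights, correct)]
    total = sum(updated)
    return [w / total for w in updated], alpha
```

After one round, the misclassified instances collectively carry half of the total weight, which is what forces the next classifier to focus on them.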




Journal:

Volume   Issue 

Pages  -

Publication date: 2011